High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Authors
Abstract
Similar papers
Classification by ensembles from random partitions of high-dimensional data
A robust classification procedure is developed based on ensembles of classifiers, with each classifier constructed from a different set of predictors determined by a random partition of the entire set of predictors. The proposed methods combine the results of multiple classifiers to achieve a substantially improved prediction compared to the optimal single classifier. This approach is designed ...
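The abstract only sketches the idea; as a hedged illustration (the base learner, block count, and synthetic data below are our assumptions, not the paper's procedure), one random partition of the predictors can be turned into a majority-vote ensemble like this:

```python
import numpy as np

# Illustrative sketch: partition the predictors at random into disjoint blocks,
# fit one simple classifier (nearest centroid) per block, and combine the
# block-wise predictions by majority vote.

def fit_centroids(X, y):
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict_centroids(model, X):
    classes, centroids = model
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(d, axis=1)]

def random_partition_ensemble(X, y, X_test, n_blocks=5, seed=0):
    rng = np.random.default_rng(seed)
    perm = rng.permutation(X.shape[1])
    blocks = np.array_split(perm, n_blocks)  # one random partition of predictors
    votes = np.stack([
        predict_centroids(fit_centroids(X[:, b], y), X_test[:, b])
        for b in blocks
    ])
    # majority vote across the block-wise classifiers
    return np.array([np.bincount(col).argmax() for col in votes.T])

# Synthetic two-class data: class 1 is shifted by 1 standard deviation per feature.
rng = np.random.default_rng(42)
n, p = 200, 20
X = np.vstack([rng.standard_normal((n, p)),
               rng.standard_normal((n, p)) + 1.0])
y = np.array([0] * n + [1] * n)
X_test = np.vstack([rng.standard_normal((50, p)),
                    rng.standard_normal((50, p)) + 1.0])
y_test = np.array([0] * 50 + [1] * 50)

pred = random_partition_ensemble(X, y, X_test, n_blocks=5)
acc = (pred == y_test).mean()
```

Each block-wise classifier is weak (it sees only p / n_blocks predictors), but the majority vote over the partition is substantially more accurate than any single block.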
Sharp thresholds for high-dimensional and noisy recovery of sparsity
The problem of consistently estimating the sparsity pattern of a vector β* ∈ R^p from observations contaminated by noise arises in various contexts, including subset selection in regression, structure estimation in graphical models, sparse approximation, and signal denoising. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering th...
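As a hedged sketch of the setup this abstract describes (the solver, problem sizes, and regularization level below are our choices, not the paper's): recover the support of a sparse β* from noisy observations y = Xβ* + w by solving the Lasso with iterative soft-thresholding (ISTA).

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iters=1000):
    """Minimize 0.5 * ||y - X b||^2 + lam * ||b||_1 by proximal gradient (ISTA)."""
    beta = np.zeros(X.shape[1])
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part
    for _ in range(n_iters):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - grad / L, lam / L)
    return beta

# Sparse ground truth: 3 nonzeros out of 20 coordinates.
rng = np.random.default_rng(0)
n, p = 100, 20
beta_star = np.zeros(p)
beta_star[:3] = [2.0, -1.5, 1.0]
X = rng.standard_normal((n, p))
y = X @ beta_star + 0.01 * rng.standard_normal(n)

beta_hat = lasso_ista(X, y, lam=0.5)
support = set(np.flatnonzero(np.abs(beta_hat) > 1e-3))
```

With noise this small and well-conditioned Gaussian design, the estimated support coincides with the true one; the paper's contribution is characterizing exactly when (in terms of n, p, and sparsity) such recovery succeeds.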
Efficient Recovery of Low-Dimensional Structure from High-Dimensional Data
Many modeling tasks in computer vision, e.g. structure from motion, shape/reflectance from shading, filter synthesis, have a low-dimensional intrinsic structure even though the dimension of the input data can be relatively large. We propose a simple but surprisingly effective iterative randomized algorithm that drastically cuts down the time required for recovering the intrinsic structure. The compu...
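The abstract's "iterative randomized algorithm" is not spelled out here; as an illustrative stand-in (a standard randomized range finder, not necessarily the authors' method), random projection recovers the intrinsic low-dimensional structure of a nearly low-rank data matrix far more cheaply than a full decomposition:

```python
import numpy as np

def randomized_subspace(A, k, oversample=5, seed=0):
    """Recover the top-k subspace of A via a random sketch (randomized range finder)."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)        # orthonormal basis for the sketched range
    B = Q.T @ A                           # small (k + oversample) x n matrix
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k]

# High-dimensional data with intrinsic rank 3.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 100))

U, s, Vt = randomized_subspace(A, k=3)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
```

Only one pass of A against a thin random matrix is needed before all remaining work happens on small matrices, which is where the drastic cost reduction comes from.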
High-dimensional Joint Sparsity Random Effects Model for Multi-task Learning
Joint sparsity regularization in multi-task learning has attracted much attention in recent years. The traditional convex formulation employs the group Lasso relaxation to achieve joint sparsity across tasks. Although this approach leads to a simple convex formulation, it suffers from several issues due to the looseness of the relaxation. To remedy this problem, we view jointly sparse multi-tas...
Journal
Journal title: Machine Learning
Year: 2020
ISSN: 0885-6125,1573-0565
DOI: 10.1007/s10994-019-05865-4